Dynamic Decision Making (DDM) is a skill that is both increasingly required of and difficult to train in military commanders in today's security environment. DDM is difficult for humans to acquire because it requires the timely sequencing of interdependent decisions in order to control complex, non-linear systems. Microworlds, stripped-down simulations that focus on the dynamics of the target systems, have been proposed by many as training environments for DDM that avoid the time commitment, cost and personal danger of training command decision making through full-scale exercises or mission experience. However, little research has been conducted on the factors that lead to effective microworld-based training. Specifically, it is unknown whether the time compression that occurs in microworlds enhances or inhibits the learning and transfer of complex system dynamics. A pilot study was conducted to examine whether participants are able to learn a simple DDM task in an accelerated microworld environment and then perform the same task in a similar but much slower environment. The results suggest that compressed-time microworlds can support training and transfer of DDM skills to “real-time” environments, but that much remains to be learned about the conditions that favour the learning of DDM skills. Based on these results, general considerations for training DDM with microworlds and specific recommendations for improving the current study are provided.